Console Output
Training and evaluating model for: Microwave
Dataset length: 24915 windows
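The window count reported above is consistent with a stride-1 sliding window over the mains signal (a series of length N yields N - window + 1 windows). The exact windowing used by the script is not shown in the log; a minimal sketch under that assumption:

```python
def sliding_windows(series, window, stride=1):
    """Hypothetical stride-1 sliding window over a 1-D series.

    For stride 1 this produces len(series) - window + 1 windows,
    matching the 'Dataset length: ... windows' count printed above.
    """
    return [series[i:i + window] for i in range(0, len(series) - window + 1, stride)]
```

For example, a series of 10 samples with a window of 4 yields 7 overlapping windows.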
NILMModel(
(conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
(lstm): LSTM(9, 256, num_layers=4, batch_first=True, dropout=0.1)
(dropout): Dropout(p=0.1, inplace=False)
(relu): ReLU()
(output_layer): Linear(in_features=256, out_features=1, bias=True)
)
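The printed repr above fixes the layer definitions (a 9-channel Conv1d front end, a 4-layer LSTM with 256 hidden units, dropout, ReLU, and a single-output linear head) but not how they are wired. A sketch that reproduces that repr; the forward pass, including the permutes between channels-first convolution and batch-first LSTM, is an assumption:

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Reconstruction of the architecture from the repr above.
    Layer hyperparameters are taken from the log; forward() is assumed."""
    def __init__(self, n_features=9, hidden=256):
        super().__init__()
        self.conv1d = nn.Conv1d(n_features, n_features, kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(n_features, hidden, num_layers=4, batch_first=True, dropout=0.1)
        self.dropout = nn.Dropout(p=0.1)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, seq_len, n_features); Conv1d expects (batch, channels, seq_len)
        y = self.relu(self.conv1d(x.permute(0, 2, 1)))
        y, _ = self.lstm(y.permute(0, 2, 1))   # back to batch-first for the LSTM
        return self.output_layer(self.dropout(y))  # per-timestep power estimate
```

With a batch of shape (batch, seq_len, 9) this returns one power value per timestep, shape (batch, seq_len, 1).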
Epoch [1/300], Train Loss: 0.000811
Validation Loss: 0.000614
Epoch [2/300], Train Loss: 0.000684
Validation Loss: 0.000611
Epoch [3/300], Train Loss: 0.000679
Validation Loss: 0.000607
Epoch [4/300], Train Loss: 0.000661
Validation Loss: 0.000568
Epoch [5/300], Train Loss: 0.000603
Validation Loss: 0.000575
Epoch [6/300], Train Loss: 0.000600
Validation Loss: 0.000496
Epoch [7/300], Train Loss: 0.000545
Validation Loss: 0.000558
Epoch [8/300], Train Loss: 0.000549
Validation Loss: 0.000483
Epoch [9/300], Train Loss: 0.000534
Validation Loss: 0.000484
Epoch [10/300], Train Loss: 0.000525
Validation Loss: 0.000479
Epoch [11/300], Train Loss: 0.000521
Validation Loss: 0.000463
Epoch [12/300], Train Loss: 0.000509
Validation Loss: 0.000430
Epoch [13/300], Train Loss: 0.000484
Validation Loss: 0.000439
Epoch [14/300], Train Loss: 0.000510
Validation Loss: 0.000445
Epoch [15/300], Train Loss: 0.000475
Validation Loss: 0.000406
Epoch [16/300], Train Loss: 0.000464
Validation Loss: 0.000406
Epoch [17/300], Train Loss: 0.000455
Validation Loss: 0.000385
Epoch [18/300], Train Loss: 0.000432
Validation Loss: 0.000379
Epoch [19/300], Train Loss: 0.000441
Validation Loss: 0.000383
Epoch [20/300], Train Loss: 0.000441
Validation Loss: 0.000384
Epoch [21/300], Train Loss: 0.000420
Validation Loss: 0.000361
Epoch [22/300], Train Loss: 0.000416
Validation Loss: 0.000360
Epoch [23/300], Train Loss: 0.000409
Validation Loss: 0.000354
Epoch [24/300], Train Loss: 0.000406
Validation Loss: 0.000345
Epoch [25/300], Train Loss: 0.000397
Validation Loss: 0.000343
Epoch [26/300], Train Loss: 0.000394
Validation Loss: 0.000363
Epoch [27/300], Train Loss: 0.000393
Validation Loss: 0.000339
Epoch [28/300], Train Loss: 0.000382
Validation Loss: 0.000340
Epoch [29/300], Train Loss: 0.000378
Validation Loss: 0.000325
Epoch [30/300], Train Loss: 0.000377
Validation Loss: 0.000333
Epoch [31/300], Train Loss: 0.000369
Validation Loss: 0.000331
Epoch [32/300], Train Loss: 0.000369
Validation Loss: 0.000317
Epoch [33/300], Train Loss: 0.000359
Validation Loss: 0.000308
Epoch [34/300], Train Loss: 0.000353
Validation Loss: 0.000318
Epoch [35/300], Train Loss: 0.000357
Validation Loss: 0.000316
Epoch [36/300], Train Loss: 0.000347
Validation Loss: 0.000303
Epoch [37/300], Train Loss: 0.000347
Validation Loss: 0.000298
Epoch [38/300], Train Loss: 0.000344
Validation Loss: 0.000294
Epoch [39/300], Train Loss: 0.000334
Validation Loss: 0.000304
Epoch [40/300], Train Loss: 0.000347
Validation Loss: 0.000294
Epoch [41/300], Train Loss: 0.000336
Validation Loss: 0.000317
Epoch [42/300], Train Loss: 0.000353
Validation Loss: 0.000290
Epoch [43/300], Train Loss: 0.000337
Validation Loss: 0.000314
Epoch [44/300], Train Loss: 0.000321
Validation Loss: 0.000281
Epoch [45/300], Train Loss: 0.000317
Validation Loss: 0.000274
Epoch [46/300], Train Loss: 0.000310
Validation Loss: 0.000283
Epoch [47/300], Train Loss: 0.000307
Validation Loss: 0.000271
Epoch [48/300], Train Loss: 0.000315
Validation Loss: 0.000272
Epoch [49/300], Train Loss: 0.000317
Validation Loss: 0.000285
Epoch [50/300], Train Loss: 0.000308
Validation Loss: 0.000268
Epoch [51/300], Train Loss: 0.000296
Validation Loss: 0.000268
Epoch [52/300], Train Loss: 0.000291
Validation Loss: 0.000272
Epoch [53/300], Train Loss: 0.000331
Validation Loss: 0.000276
Epoch [54/300], Train Loss: 0.000310
Validation Loss: 0.000262
Epoch [55/300], Train Loss: 0.000332
Validation Loss: 0.000312
Epoch [56/300], Train Loss: 0.000324
Validation Loss: 0.000278
Epoch [57/300], Train Loss: 0.000307
Validation Loss: 0.000271
Epoch [58/300], Train Loss: 0.000298
Validation Loss: 0.000268
Epoch [59/300], Train Loss: 0.000298
Validation Loss: 0.000261
Epoch [60/300], Train Loss: 0.000298
Validation Loss: 0.000258
Epoch [61/300], Train Loss: 0.000284
Validation Loss: 0.000253
Epoch [62/300], Train Loss: 0.000278
Validation Loss: 0.000244
Epoch [63/300], Train Loss: 0.000276
Validation Loss: 0.000251
Epoch [64/300], Train Loss: 0.000268
Validation Loss: 0.000251
Epoch [65/300], Train Loss: 0.000384
Validation Loss: 0.000295
Epoch [66/300], Train Loss: 0.000316
Validation Loss: 0.000259
Epoch [67/300], Train Loss: 0.000293
Validation Loss: 0.000254
Epoch [68/300], Train Loss: 0.000279
Validation Loss: 0.000245
Epoch [69/300], Train Loss: 0.000276
Validation Loss: 0.000254
Epoch [70/300], Train Loss: 0.000288
Validation Loss: 0.000254
Epoch [71/300], Train Loss: 0.000284
Validation Loss: 0.000236
Epoch [72/300], Train Loss: 0.000272
Validation Loss: 0.000236
Epoch [73/300], Train Loss: 0.000258
Validation Loss: 0.000230
Epoch [74/300], Train Loss: 0.000255
Validation Loss: 0.000227
Epoch [75/300], Train Loss: 0.000241
Validation Loss: 0.000207
Epoch [76/300], Train Loss: 0.000234
Validation Loss: 0.000204
Epoch [77/300], Train Loss: 0.000231
Validation Loss: 0.000224
Epoch [78/300], Train Loss: 0.000228
Validation Loss: 0.000206
Epoch [79/300], Train Loss: 0.000220
Validation Loss: 0.000197
Epoch [80/300], Train Loss: 0.000216
Validation Loss: 0.000204
Epoch [81/300], Train Loss: 0.000204
Validation Loss: 0.000189
Epoch [82/300], Train Loss: 0.000207
Validation Loss: 0.000178
Epoch [83/300], Train Loss: 0.000199
Validation Loss: 0.000296
Epoch [84/300], Train Loss: 0.000257
Validation Loss: 0.000191
Epoch [85/300], Train Loss: 0.000207
Validation Loss: 0.000180
Epoch [86/300], Train Loss: 0.000194
Validation Loss: 0.000182
Epoch [87/300], Train Loss: 0.000182
Validation Loss: 0.000170
Epoch [88/300], Train Loss: 0.000175
Validation Loss: 0.000166
Epoch [89/300], Train Loss: 0.000205
Validation Loss: 0.000198
Epoch [90/300], Train Loss: 0.000191
Validation Loss: 0.000158
Epoch [91/300], Train Loss: 0.000220
Validation Loss: 0.000215
Epoch [92/300], Train Loss: 0.000203
Validation Loss: 0.000161
Epoch [93/300], Train Loss: 0.000168
Validation Loss: 0.000154
Epoch [94/300], Train Loss: 0.000202
Validation Loss: 0.000155
Epoch [95/300], Train Loss: 0.000168
Validation Loss: 0.000145
Epoch [96/300], Train Loss: 0.000156
Validation Loss: 0.000144
Epoch [97/300], Train Loss: 0.000168
Validation Loss: 0.000147
Epoch [98/300], Train Loss: 0.000167
Validation Loss: 0.000149
Epoch [99/300], Train Loss: 0.000177
Validation Loss: 0.000201
Epoch [100/300], Train Loss: 0.000192
Validation Loss: 0.000146
Epoch [101/300], Train Loss: 0.000162
Validation Loss: 0.000141
Epoch [102/300], Train Loss: 0.000164
Validation Loss: 0.000159
Epoch [103/300], Train Loss: 0.000170
Validation Loss: 0.000175
Epoch [104/300], Train Loss: 0.000163
Validation Loss: 0.000140
Epoch [105/300], Train Loss: 0.000155
Validation Loss: 0.000150
Epoch [106/300], Train Loss: 0.000146
Validation Loss: 0.000128
Epoch [107/300], Train Loss: 0.000140
Validation Loss: 0.000125
Epoch [108/300], Train Loss: 0.000135
Validation Loss: 0.000130
Epoch [109/300], Train Loss: 0.000142
Validation Loss: 0.000158
Epoch [110/300], Train Loss: 0.000167
Validation Loss: 0.000126
Epoch [111/300], Train Loss: 0.000133
Validation Loss: 0.000130
Epoch [112/300], Train Loss: 0.000137
Validation Loss: 0.000165
Epoch [113/300], Train Loss: 0.000139
Validation Loss: 0.000128
Epoch [114/300], Train Loss: 0.000125
Validation Loss: 0.000111
Epoch [115/300], Train Loss: 0.000128
Validation Loss: 0.000114
Epoch [116/300], Train Loss: 0.000125
Validation Loss: 0.000138
Epoch [117/300], Train Loss: 0.000118
Validation Loss: 0.000136
Epoch [118/300], Train Loss: 0.000118
Validation Loss: 0.000138
Epoch [119/300], Train Loss: 0.000128
Validation Loss: 0.000245
Epoch [120/300], Train Loss: 0.000159
Validation Loss: 0.000171
Epoch [121/300], Train Loss: 0.000120
Validation Loss: 0.000118
Epoch [122/300], Train Loss: 0.000111
Validation Loss: 0.000127
Epoch [123/300], Train Loss: 0.000109
Validation Loss: 0.000097
Epoch [124/300], Train Loss: 0.000109
Validation Loss: 0.000104
Epoch [125/300], Train Loss: 0.000102
Validation Loss: 0.000092
Epoch [126/300], Train Loss: 0.000106
Validation Loss: 0.000110
Epoch [127/300], Train Loss: 0.000106
Validation Loss: 0.000094
Epoch [128/300], Train Loss: 0.000097
Validation Loss: 0.000089
Epoch [129/300], Train Loss: 0.000105
Validation Loss: 0.000101
Epoch [130/300], Train Loss: 0.000092
Validation Loss: 0.000093
Epoch [131/300], Train Loss: 0.000115
Validation Loss: 0.000160
Epoch [132/300], Train Loss: 0.000102
Validation Loss: 0.000088
Epoch [133/300], Train Loss: 0.000088
Validation Loss: 0.000087
Epoch [134/300], Train Loss: 0.000106
Validation Loss: 0.000142
Epoch [135/300], Train Loss: 0.000178
Validation Loss: 0.000136
Epoch [136/300], Train Loss: 0.000103
Validation Loss: 0.000124
Epoch [137/300], Train Loss: 0.000090
Validation Loss: 0.000086
Epoch [138/300], Train Loss: 0.000084
Validation Loss: 0.000086
Epoch [139/300], Train Loss: 0.000081
Validation Loss: 0.000085
Epoch [140/300], Train Loss: 0.000081
Validation Loss: 0.000084
Epoch [141/300], Train Loss: 0.000078
Validation Loss: 0.000085
Epoch [142/300], Train Loss: 0.000080
Validation Loss: 0.000088
Epoch [143/300], Train Loss: 0.000077
Validation Loss: 0.000085
Epoch [144/300], Train Loss: 0.000077
Validation Loss: 0.000075
Epoch [145/300], Train Loss: 0.000085
Validation Loss: 0.000080
Epoch [146/300], Train Loss: 0.000085
Validation Loss: 0.000080
Epoch [147/300], Train Loss: 0.000074
Validation Loss: 0.000076
Epoch [148/300], Train Loss: 0.000080
Validation Loss: 0.000084
Epoch [149/300], Train Loss: 0.000076
Validation Loss: 0.000075
Epoch [150/300], Train Loss: 0.000070
Validation Loss: 0.000088
Epoch [151/300], Train Loss: 0.000080
Validation Loss: 0.000072
Epoch [152/300], Train Loss: 0.000069
Validation Loss: 0.000072
Epoch [153/300], Train Loss: 0.000068
Validation Loss: 0.000077
Epoch [154/300], Train Loss: 0.000067
Validation Loss: 0.000077
Epoch [155/300], Train Loss: 0.000066
Validation Loss: 0.000081
Epoch [156/300], Train Loss: 0.000087
Validation Loss: 0.000082
Epoch [157/300], Train Loss: 0.000068
Validation Loss: 0.000089
Epoch [158/300], Train Loss: 0.000070
Validation Loss: 0.000073
Epoch [159/300], Train Loss: 0.000074
Validation Loss: 0.000074
Epoch [160/300], Train Loss: 0.000065
Validation Loss: 0.000070
Epoch [161/300], Train Loss: 0.000064
Validation Loss: 0.000066
Epoch [162/300], Train Loss: 0.000062
Validation Loss: 0.000065
Epoch [163/300], Train Loss: 0.000061
Validation Loss: 0.000065
Epoch [164/300], Train Loss: 0.000060
Validation Loss: 0.000071
Epoch [165/300], Train Loss: 0.000065
Validation Loss: 0.000074
Epoch [166/300], Train Loss: 0.000060
Validation Loss: 0.000070
Epoch [167/300], Train Loss: 0.000060
Validation Loss: 0.000070
Epoch [168/300], Train Loss: 0.000058
Validation Loss: 0.000065
Epoch [169/300], Train Loss: 0.000057
Validation Loss: 0.000068
Epoch [170/300], Train Loss: 0.000057
Validation Loss: 0.000065
Epoch [171/300], Train Loss: 0.000056
Validation Loss: 0.000069
Epoch [172/300], Train Loss: 0.000055
Validation Loss: 0.000065
Epoch [173/300], Train Loss: 0.000083
Validation Loss: 0.000122
Early stopping triggered
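The run halts at epoch 173 of 300 after the validation loss stops improving on its epoch-162 minimum. The patience value is not shown in the log; a minimal early-stopping sketch with an assumed patience of 10 (which is consistent with the stop point above under strict improvement):

```python
class EarlyStopping:
    """Minimal early-stopping sketch; the patience used in the run
    above is not printed and patience=10 is an assumption."""
    def __init__(self, patience=10):
        self.patience = patience
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Call once per epoch with the validation loss; returns True
        when training should stop."""
        if val_loss < self.best:   # strict improvement resets the counter
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience
```

In the loop, `if stopper.step(val_loss): break` would produce the "Early stopping triggered" line above.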
Evaluating model for: Microwave
Validation MAE: 2.487846 W
Validation MSE: 1106.452271 W²
Validation RMSE: 33.263378 W
Signal Aggregate Error (SAE): 0.019418
Normalized Disaggregation Error (NDE): 0.328955
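SAE and NDE are the usual NILM evaluation metrics: SAE measures the relative error in total predicted energy, NDE the squared error normalized by the true signal's energy. The script's exact formulas are not shown; a sketch using the standard definitions (note RMSE = sqrt(MSE) holds for the values above, 33.26 ≈ √1106.45):

```python
import math

def nilm_metrics(y_true, y_pred):
    """Standard NILM metrics; assumed to match the script's definitions.

    SAE = |sum(pred) - sum(true)| / sum(true)
    NDE = sum((true - pred)^2) / sum(true^2)
    """
    n = len(y_true)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    rmse = math.sqrt(mse)
    sae = abs(sum(y_pred) - sum(y_true)) / sum(y_true)
    nde = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / sum(t ** 2 for t in y_true)
    return {"mae": mae, "mse": mse, "rmse": rmse, "sae": sae, "nde": nde}
```

A low SAE with a higher NDE (as above: 0.019 vs. 0.329) is typical for bursty loads like a microwave: total energy is matched well while individual on/off events are harder to localize.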
[Figure: Training and Validation Loss curves (interactive plot not reproduced here)]